In [3]:
import sys
sys.path.append('..')

import random

Pynetics QuickStart

In this example we are going to build a very simple and useless algorithm to explore the possibilities of the pynetics library.

Our problem will be as follows. We're going to develop a genetic algorithm to find which binary list of length $L=N$ has the biggest sum. Yes, totally absurd, but useful for learning GAs and this library.
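
Just to make the goal explicit before using any GA machinery, here is a tiny brute-force sketch in plain Python (not part of pynetics, and only feasible for a very small $L$): the winner is, unsurprisingly, the list made entirely of $1$'s.


In [ ]:
from itertools import product

# Brute force over all 2**L binary lists for a tiny L, just to state the goal.
L = 4
best = max(product([0, 1], repeat=L), key=sum)
print(best)  # expected: (1, 1, 1, 1)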

Let's start from the beginning: the individuals.

Representing individuals

The individuals are the most important component in a genetic algorithm. Each individual is a possible solution, good or bad, for our problem.

We want to model an individual capable of representing a possible solution to our problem. Pynetics has a perfect representation for this problem, the BinaryIndividual, so it's not necessary to create a custom individual. We'll cross that bridge when we get to it.

The algorithm will create individuals using a SpawningPool implementation. We're going to use a special implementation from the module pynetics.ga_bin called BinaryIndividualSpawningPool, which creates BinaryIndividual instances of the given size.


In [4]:
from pynetics.ga_bin import BinaryIndividualSpawningPool

# Let's define the size of our individuals (the number of 1's and 0's)
individual_size = 25
binary_individual_spawning_pool = BinaryIndividualSpawningPool(size=individual_size)

Now the spawning pool is capable of creating individuals of the specified size. The genetic algorithm will create a population of individuals by calling its spawn method. We'll also specify a population size for the algorithm and take a look at an example population:


In [11]:
population_size = 10

for i in range(population_size):
    individual = binary_individual_spawning_pool.spawn()
    print(i, '->', individual)


0 -> 1101011010001111000010101
1 -> 1100001010000010101001111
2 -> 0001000110001110110011100
3 -> 1010111011011011010001101
4 -> 0110110110010100100111010
5 -> 1001101000011000111001110
6 -> 1011100110000010010111110
7 -> 0011110101011001011110010
8 -> 1110001000100000000100101
9 -> 0100001101010000001110101

Fitness

Our individuals are solutions to the problem, but how can we measure how good or how bad they are? That is what the fitness is for. It's a function that returns a float value: the bigger the value, the better the individual.

We could use a fitness function equal to the sum of all the $1$'s, but if we want to stop the algorithm based on the fitness, the best possible fitness for an individual of size 10 would not be the same as for an individual of size 20.

So the fitness function we're going to use has the form $1 / (1 + \alpha)$, where $\alpha$ is the error of our individual. The error will be computed as the number of $0$'s the individual has.


In [12]:
def maximize_ones_fitness(individual):
    # The error is the number of 0's, i.e. the genes that are not 1.
    error = len(individual) - sum(individual)
    return 1 / (1 + error)

This function guarantees that the fitness will belong to the $(0, 1]$ interval. Let's see an example of how it behaves.


In [17]:
for i in range(population_size):
    individual = binary_individual_spawning_pool.spawn()
    fitness = maximize_ones_fitness(individual)
    print(i, '->', individual, fitness)


0 -> 0110101011110010111000101 0.08333333333333333
1 -> 1010100010010100011001001 0.0625
2 -> 1110001011110011111100100 0.09090909090909091
3 -> 1111011001100110000000111 0.07692307692307693
4 -> 1110010111000011000010110 0.07142857142857142
5 -> 0110001110000110011010110 0.07142857142857142
6 -> 0001010000111110010111101 0.07692307692307693
7 -> 0110010111000111000100001 0.06666666666666667
8 -> 1001100000001111000111000 0.0625
9 -> 1110010010111000111100100 0.07692307692307693
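
As a quick sanity check of those bounds (a minimal sketch that assumes the fitness function can be fed a plain Python list, which works here because it only relies on len and sum), the two extremes look like this:


In [ ]:
# All 1's -> no error -> fitness 1.0, the upper bound of the interval.
print(maximize_ones_fitness([1] * individual_size))  # expected: 1.0
# All 0's -> maximum error -> fitness 1 / (1 + 25), close to the lower bound.
print(maximize_ones_fitness([0] * individual_size))  # expected: ~0.0385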

The stop condition

Now we're going to specify when our algorithm should stop. This is controlled by a stop condition.


In [18]:
from pynetics.stop import FitnessBound

fitness_stop_condition = FitnessBound(1)

Instances of the class FitnessBound are created by specifying the fitness threshold above which we can stop our algorithm. We have specified a FitnessBound object with a threshold of $1$. That means that any value below $1$ will not stop our algorithm, whereas any value greater than or equal to $1$ will.

Because our fitness value belongs to the $(0, 1]$ interval, the algorithm will stop only when the population has an individual with a fitness of $1$ (all $1$'s).
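
To make the idea concrete, here is a rough plain-Python sketch of what such a stop condition boils down to (an illustration, not pynetics' actual FitnessBound implementation; it only assumes the ga.best() and fitness() calls used later in this tutorial):


In [ ]:
# Conceptual sketch only, not the real FitnessBound class.
def fitness_bound_reached(ga, bound=1.0):
    # Stop once the best individual's fitness reaches the bound.
    return ga.best().fitness() >= bound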

Selecting individuals

For our GA, we're going to use tournament selection. Tournament selection works by selecting $n$ individuals randomly from the population and then returning the best of them (based on their fitness).


In [19]:
from pynetics.selections import Tournament

tournament_selection = Tournament(2)
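
To illustrate the mechanism described above, here is a minimal plain-Python sketch of a tournament (an illustration, not the library's operator), assuming the population is just a list of individuals whose fitness we can compute:


In [ ]:
import random

# Illustrative sketch of tournament selection (not pynetics' code).
def tournament_select(population, fitness, n=2):
    # Pick n individuals at random and keep the fittest of them.
    contenders = random.sample(population, n)
    return max(contenders, key=fitness)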

Recombining

Now the recombination, i.e. the step where the selected individuals pass their genetic information on to their progeny.

We'll use a OnePointRecombination, included in the module ga_list. We'll also set the probability for two individuals to mate to $1$, that is, they always mate.


In [20]:
from pynetics.ga_list import OnePointRecombination

recombination_probability = 1
recombination = OnePointRecombination()
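
For intuition, this is roughly what a one-point recombination does (an illustrative sketch on plain Python lists, not the library's operator): a cut point is chosen and the two parents exchange their tails.


In [ ]:
import random

# Illustrative sketch of one-point crossover on plain lists.
def one_point_crossover(parent_a, parent_b):
    cut = random.randint(1, len(parent_a) - 1)  # cut somewhere inside the list
    child_a = parent_a[:cut] + parent_b[cut:]
    child_b = parent_b[:cut] + parent_a[cut:]
    return child_a, child_b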

Mutations

The same goes for mutations. The mutation operator we're going to use is AllGenesCanSwitch, a mutation where each binary gene has a probability of being switched from $0$ to $1$ and vice versa. It belongs to the module ga_bin.


In [21]:
from pynetics.ga_bin import AllGenesCanSwitch

mutation_probability = 1 / individual_size
mutation = AllGenesCanSwitch()
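
Conceptually (again, a plain-Python sketch and not the library's operator), a mutation like this flips each gene independently with the given probability:


In [ ]:
import random

# Illustrative sketch of per-gene bit-flip mutation on a plain list.
def flip_each_gene(genes, p):
    return [1 - g if random.random() < p else g for g in genes]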

Replacement

Once we've got the offspring, we need to replace part of the population with these newborns. The operator for that purpose will be the LowElitism operator, where the worst individuals of the population are replaced by the offspring.

We'll fix the replacement rate at $0.9$, i.e. $90\%$ of the population will be replaced on each iteration of the loop.


In [22]:
from pynetics.replacements import LowElitism

replacement_rate = 0.9
replacement = LowElitism()
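
The idea behind low elitism, sketched in plain Python (an illustration, not the library's operator): sort the population by fitness and let the offspring overwrite the worst part of it.


In [ ]:
# Illustrative sketch of a low-elitism replacement (not pynetics' code).
def low_elitism_replace(population, offspring, fitness, rate=0.9):
    n_replaced = int(len(population) * rate)
    survivors = sorted(population, key=fitness, reverse=True)[:len(population) - n_replaced]
    return survivors + offspring[:n_replaced]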

The algorithm

Time to put all the pieces together in a SimpleGA instance, which will run the evolutionary loop for us.


In [42]:
from pynetics.algorithms import SimpleGA

ga = SimpleGA(
    stop_condition=fitness_stop_condition,
    population_size=population_size,
    fitness=maximize_ones_fitness,
    spawning_pool=binary_individual_spawning_pool,
    selection=tournament_selection,
    recombination=recombination,
    mutation=mutation,
    replacement=replacement,
    p_recombination=recombination_probability,
    p_mutation=mutation_probability,
    replacement_rate=replacement_rate,
)
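
To see how these pieces fit together, here is a rough conceptual sketch of one generation of that loop, built from the illustrative helpers sketched in the previous sections (this is not SimpleGA's actual internals; it just shows the usual select, recombine, mutate, replace cycle on plain binary lists):


In [ ]:
import random

# Conceptual sketch of one generation (not pynetics' internals). It reuses the
# illustrative helpers defined above: tournament_select, one_point_crossover,
# flip_each_gene and low_elitism_replace.
def one_generation(population, fitness, p_recombination, p_mutation, rate):
    offspring = []
    while len(offspring) < len(population):
        parent_a = tournament_select(population, fitness)
        parent_b = tournament_select(population, fitness)
        if random.random() < p_recombination:
            child_a, child_b = one_point_crossover(parent_a, parent_b)
        else:
            child_a, child_b = parent_a, parent_b
        offspring.append(flip_each_gene(child_a, p_mutation))
        offspring.append(flip_each_gene(child_b, p_mutation))
    return low_elitism_replace(population, offspring, fitness, rate)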

Now that we've created our algorithm, we can run it to find the right solution. Let's see how it works.


In [43]:
ga.run()
print(ga.best())


1111111111111111111111111

We can specify functions to be executed while the algorithm runs. The next example adds some of those functions.


In [44]:
ga.on_start(
    lambda ga: print('Starting genetic algorithm')
).on_end(
    lambda ga: print('Genetic Algorithm ended. Best individual:', ga.best())
).on_step_start(
    lambda ga: print('Step:', ga.generation, '->', end='')
).on_step_end(
    lambda ga: print(ga.best(), 'fitness:', ga.best().fitness())
)
ga.run()


Starting genetic algorithm
Step: 0 ->1001001110101110010111110 fitness: 0.09090909090909091
Step: 1 ->1001001111101001111110011 fitness: 0.1
Step: 2 ->1011001110101011110111011 fitness: 0.1111111111111111
Step: 3 ->1111100101111011110111100 fitness: 0.125
Step: 4 ->1111100101111111111110100 fitness: 0.14285714285714285
Step: 5 ->1111100101111111111110100 fitness: 0.14285714285714285
Step: 6 ->1111100101111111111110100 fitness: 0.14285714285714285
Step: 7 ->1111100111111111111110011 fitness: 0.2
Step: 8 ->1111100111111111111110011 fitness: 0.2
Step: 9 ->1111100111111111111111011 fitness: 0.25
Step: 10 ->1111101111111111111111011 fitness: 0.3333333333333333
Step: 11 ->1111111111111111111111011 fitness: 0.5
Step: 12 ->1111111111111111111111011 fitness: 0.5
Step: 13 ->1111111111111111111111011 fitness: 0.5
Step: 14 ->1111111111111111111111011 fitness: 0.5
Step: 15 ->1111111111111111111111011 fitness: 0.5
Step: 16 ->1111111111111111111111111 fitness: 1.0
Genetic Algorithm ended. Best individual: 1111111111111111111111111

And here ends the quickstart tutorial!

